Analysis of Complexity Bounds for Pac-Learning with Random Sets

Authors

  • E. M. Oblow
  • V. R. R. Uppuluri
Abstract

Learnability in Valiant’s pac-learning formalism is reformulated in terms of expected (average) error instead of confidence and error parameters. A finite-domain, random set formalism is introduced to develop algorithm-dependent, distribution-specific analytic error estimates. Two random set theorems for finite concept-spaces are presented to facilitate these developments. Analyses are carried out for several illustrative problems with worst-case and semi-uniform distributions of learning examples. Analytic bounds on the sample size needed to achieve a specified average error are established. Useful approximations for these bounds and a worst-case distribution are also derived. Conclusions are drawn about the potential value of average-error bounds in improving the stated efficiency of pac-learning algorithms.
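As background for the average-error reformulation described above, the classical worst-case pac-learning bound for a finite concept space, which the paper departs from, can be sketched as follows. This is a standard textbook bound (realizable case), not the paper's distribution-specific analysis; the function name and example numbers are illustrative.

```python
import math

def pac_sample_bound(hypothesis_count: int, epsilon: float, delta: float) -> int:
    """Classical sample-size bound for a finite, realizable concept space:
    m >= (1/epsilon) * (ln|H| + ln(1/delta)) examples suffice so that, with
    probability at least 1 - delta, the learned concept has error <= epsilon."""
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

# Illustrative: a concept space of 2**16 Boolean concepts, 5% error, 95% confidence.
print(pac_sample_bound(2**16, epsilon=0.05, delta=0.05))  # -> 282
```

Because this bound holds for every distribution and every consistent learner, it is typically much looser than the algorithm-dependent, distribution-specific average-error bounds the paper develops.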

Similar articles

Sample Complexity of Composite Likelihood

We present the first PAC bounds for learning parameters of Conditional Random Fields [12] with general structures over discrete and real-valued variables. Our bounds apply to composite likelihood [14], which generalizes maximum likelihood and pseudolikelihood [3]. Moreover, we show that the only existing algorithm with a PAC bound for learning high-treewidth discrete models [1] can be viewed as...

Adaptive Online Learning

We propose a general framework for studying adaptive regret bounds in the online learning framework, including model selection bounds and data-dependent bounds. Given a data- or model-dependent bound we ask, “Does there exist some algorithm achieving this bound?” We show that modifications to recently introduced sequential complexity measures can be used to answer this question by providing suffi...

A Tighter Error Bound for Decision Tree Learning Using PAC Learnability

Error bounds for decision trees are generally based on the depth or breadth of the tree. In this paper, we propose a bound on the error rate that depends on both the depth and the breadth of a specific decision tree constructed from the training samples. This bound is derived from a sample complexity estimate based on PAC learnability. The proposed bound is compared with other traditional error bounds o...

The Optimal Sample Complexity of PAC Learning

This work establishes a new upper bound on the number of samples sufficient for PAC learning in the realizable case. The bound matches known lower bounds up to numerical constant factors. This solves a long-standing open problem on the sample complexity of PAC learning. The technique and analysis build on a recent breakthrough by Hans Simon.
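The optimal realizable-case sample complexity referenced above is known to be of order (d + log(1/δ))/ε, where d is the VC dimension. A minimal sketch of evaluating that order (up to the unspecified constant factors, which the result does not pin down) might look like this; the function name is illustrative.

```python
import math

def optimal_pac_order(vc_dim: int, epsilon: float, delta: float) -> float:
    """Order of the optimal realizable-case PAC sample complexity,
    Theta((d + log(1/delta)) / epsilon), ignoring constant factors."""
    return (vc_dim + math.log(1.0 / delta)) / epsilon

# Illustrative: VC dimension 10, 10% error, 95% confidence.
print(optimal_pac_order(10, epsilon=0.1, delta=0.05))
```

Note that, unlike the finite-class bound, this expression has no ln|H| term: the VC dimension replaces the log of the concept-space size, which is what removes the extra log(1/ε) factor present in earlier upper bounds.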

PAC-Bayesian Statistical Learning Theory (a PAC-Bayesian approach)

This PhD thesis is a mathematical study of the learning task – specifically classification and least square regression – in order to better understand why an algorithm works and to propose more efficient procedures. The thesis consists of four papers. The first one provides a PAC bound for the L generalization error of methods based on combining regression procedures. This bound is tight to the...


Journal title:

Volume   Issue

Pages  -

Publication date: 2003